Search results for "Kullback–Leibler divergence"

Showing 10 of 17 documents

Extropy: Complementary Dual of Entropy

2015

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments…
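The complementary dual described in the abstract has a simple closed form: for a mass function p, extropy is J(p) = −Σᵢ (1−pᵢ) log(1−pᵢ), mirroring Shannon's H(p) = −Σᵢ pᵢ log pᵢ. A minimal numerical sketch (function names are illustrative) of the binary coincidence and the bifurcation beyond two outcomes:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def extropy(p):
    """Extropy J(p) = -sum_i (1 - p_i) log(1 - p_i), the complementary dual."""
    p = np.asarray(p, dtype=float)
    return -np.sum((1 - p) * np.log(1 - p))

# For a binary distribution the two measures coincide exactly.
binary = [0.3, 0.7]
print(np.isclose(entropy(binary), extropy(binary)))   # True

# They bifurcate for any distribution over more than two outcomes.
ternary = [0.2, 0.3, 0.5]
print(entropy(ternary), extropy(ternary))  # differ

# Both are maximized by the uniform distribution, e.g. on 3 outcomes:
print(extropy([1/3, 1/3, 1/3]) > extropy(ternary))
```

The binary identity holds because (p₁, p₂) = (p, 1−p) makes the two sums term-for-term permutations of each other.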

Keywords: Kullback–Leibler divergence; Bregman divergence; duality; proper scoring rules; Gini index of heterogeneity; repeat rate; differential and relative entropy/extropy; binary entropy function; differential entropy; entropy (information theory); axiomatization; statistical physics. (Statistical Science)

An entropy-based machine learning algorithm for combining macroeconomic forecasts

2019

This paper applies a machine learning approach with the aim of providing a single aggregated prediction from a set of individual predictions. Starting from the well-known maximum-entropy inference methodology, a new factor capturing the distance between the true and the estimated aggregated predictions poses a new problem. Algorithms such as ridge, lasso or elastic net help in finding a new methodology to tackle this issue. We carry out a simulation study to evaluate the performance of this procedure, and apply it in order to forecast and measure predictive ability using a dataset of predictions of Spanish gross domestic product.

Keywords: Kullback–Leibler divergence; maximum-entropy inference; combining predictions; averaging; elastic net regularization; machine learning; gross domestic product (GDP); econometrics.

Reduced reference 3D mesh quality assessment based on statistical models

2015

During geometry processing and transmission, 3D meshes are subject to various visual processing operations such as compression, watermarking, remeshing, noise addition and so forth. In this context it is indispensable to evaluate the quality of the distorted mesh; this is known as mesh visual quality (MVQ) assessment. Several works have tried to evaluate the MVQ using simple geometric measures; however, these metrics do not correlate well with subjective scores, since they fail to reflect the perceived quality. In this paper we propose a new objective metric to evaluate the visual quality between a mesh with perfect quality, called the reference mesh, and its dis…

Keywords: Kullback–Leibler divergence; 3D triangle mesh; mesh visual quality assessment; statistical modeling; gamma distribution; Weibull distribution; human visual system; geometry processing; distortion measurement; objective metric.

Image Quality Assessment Based on Intrinsic Mode Function Coefficients Modeling

2011

Reduced reference image quality assessment (RRIQA) methods aim to assess the quality of a perceived image with only a reduced cue from its original version, called the "reference image". The main advantage of RR methods is that they are general-purpose. However, most existing RR methods are built upon non-adaptive transform models, which can limit their scope to a small number of distortion types. In this work, we propose a bi-dimensional empirical mode decomposition-based RRIQA method. First, we decompose both the reference and distorted images into Intrinsic Mode Functions (IMFs); then we use the Generalized Gaussian Density (GGD) to model the IMF coefficients. Finally, the distortion m…

Keywords: Kullback–Leibler divergence; image quality; Hilbert–Huang transform; intrinsic mode functions; support vector machine; histogram; pattern recognition; distortion.

Image Quality Assessment Measure Based on Natural Image Statistics in the Tetrolet Domain

2012

This paper deals with a reduced reference (RR) image quality measure based on natural image statistics modeling. For this purpose, the tetrolet transform is used, since it provides a convenient way to capture local geometric structures. This transform is applied to both the reference and distorted images. Then, a Gaussian Scale Mixture (GSM) is used to model the subbands in order to take into account statistical dependencies between tetrolet coefficients. To quantify the visual degradation, a measure based on the Kullback–Leibler divergence (KLD) is provided. The proposed measure was tested on the Cornell VCL A-57 dataset and compared with other measures according to the FR-TV1 VQEG framework.
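The paper fits GSM models to tetrolet subbands and compares them via KLD; as a generic illustration of that final step, here is a histogram-based symmetrized KLD between reference and distorted coefficient sets. The Laplacian stand-in data, bin count and smoothing constant are assumptions for the sketch, not the paper's settings:

```python
import numpy as np

def histogram_kld(ref_coeffs, dist_coeffs, bins=64, eps=1e-10):
    """Symmetrized KL divergence between the empirical distributions of
    two sets of transform coefficients, a common RR quality feature."""
    lo = min(ref_coeffs.min(), dist_coeffs.min())
    hi = max(ref_coeffs.max(), dist_coeffs.max())
    p, _ = np.histogram(ref_coeffs, bins=bins, range=(lo, hi))
    q, _ = np.histogram(dist_coeffs, bins=bins, range=(lo, hi))
    p = p / p.sum() + eps          # smooth empty bins to keep logs finite
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

rng = np.random.default_rng(0)
ref = rng.laplace(0.0, 1.0, 10_000)            # stand-in for subband coefficients
mild = ref + rng.normal(0.0, 0.1, ref.shape)   # mild distortion
harsh = ref + rng.normal(0.0, 1.0, ref.shape)  # harsh distortion

# A stronger distortion should move the coefficient distribution further away.
print(histogram_kld(ref, mild) < histogram_kld(ref, harsh))
```

In the paper the divergence is computed between fitted parametric (GSM) models rather than raw histograms, which is what makes the reference side transmittable with only a few parameters.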

Keywords: Kullback–Leibler divergence; image quality; tetrolet transform; Gaussian scale mixtures; pattern recognition; reduced reference measure.

The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex

2018

The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as thought-worthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as “extropy”…

Keywords: Kullback–Leibler divergence; entropy; extropy; relative entropy/extropy; duality; Bregman divergence; Fermi–Dirac entropy; Kullback symmetric divergence; total logarithmic scoring rule; Pareto optimal exchange; prevision; probability distribution; information theory. (Entropy)

A New Image Distortion Measure Based on Natural Scene Statistics Modeling

2012

In the field of Image Quality Assessment (IQA), this paper examines a Reduced Reference (RRIQA) measure based on the bi-dimensional empirical mode decomposition. The proposed measure belongs to the Natural Scene Statistics (NSS) modeling approaches. First, the reference image is decomposed into Intrinsic Mode Functions (IMFs); the authors then use the Generalized Gaussian Density (GGD) to model the distribution of the IMF coefficients. At the receiver side, the same number of IMFs is computed on the distorted image, and the quality assessment is then done by measuring the fitting error between the IMF coefficient histogram of the distorted image and the GGD estimate of the IMF coefficients of the reference image, using the …

Keywords: Kullback–Leibler divergence; image quality; natural scene statistics; Hilbert–Huang transform; support vector machine; histogram; distortion measure. (International Journal of Computer Vision and Image Processing)

Comparing Correlation Matrix Estimators Via Kullback-Leibler Divergence

2011

We use a self-averaging measure called the Kullback–Leibler divergence to evaluate the performance of four different correlation estimators: Fourier, Pearson, Maximum Likelihood and the Hayashi–Yoshida estimator. The study uses simulated transaction prices for a large number of stocks and different data-generating mechanisms, including synchronous and non-synchronous transactions and homogeneous and heterogeneous inter-transaction times. Different distributions of stock returns, i.e. the multivariate Normal and the multivariate Student's t-distribution, are also considered. We show that the Fourier and Pearson estimators are equivalent proxies of the 'true' correlation matrix within all the settings under analysis…
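For zero-mean multivariate Gaussians the KL divergence between two covariance (or correlation) models has a closed form, KL(N(0, Σp) ‖ N(0, Σq)) = ½ [tr(Σq⁻¹ Σp) − d + ln det Σq − ln det Σp], which is what makes this kind of estimator comparison concrete. A sketch under assumed settings (the equicorrelated 'true' matrix and the naive identity baseline are illustrative, not the paper's simulation design):

```python
import numpy as np

def gaussian_kl(sigma_p, sigma_q):
    """KL( N(0, sigma_p) || N(0, sigma_q) ) for zero-mean Gaussians:
    0.5 * ( tr(Sq^-1 Sp) - d + ln det Sq - ln det Sp )."""
    d = sigma_p.shape[0]
    m = np.linalg.solve(sigma_q, sigma_p)      # Sq^-1 Sp without explicit inverse
    _, logdet_p = np.linalg.slogdet(sigma_p)
    _, logdet_q = np.linalg.slogdet(sigma_q)
    return 0.5 * (np.trace(m) - d + logdet_q - logdet_p)

# Score two estimators of a known 'true' correlation matrix by KL distance.
rng = np.random.default_rng(1)
d = 5
true_corr = 0.4 * np.ones((d, d)) + 0.6 * np.eye(d)  # equicorrelated model
x = rng.multivariate_normal(np.zeros(d), true_corr, size=2000)

pearson = np.corrcoef(x, rowvar=False)   # Pearson estimator from the sample
identity = np.eye(d)                     # naive 'no correlation' baseline

# The Pearson estimate should sit much closer to the true model.
print(gaussian_kl(true_corr, pearson) < gaussian_kl(true_corr, identity))
```

Self-averaging here means the divergence between the estimated and true matrices concentrates around its expectation as the number of variables grows, so a single simulation run already gives a stable score.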

Keywords: Kullback–Leibler divergence; correlation matrix estimators; multivariate normal distribution. (SSRN Electronic Journal)

Correlation, hierarchies, and networks in financial markets

2010

We discuss some methods to quantitatively investigate the properties of correlation matrices. Correlation matrices play an important role in portfolio optimization and in several other quantitative descriptions of asset price dynamics in financial markets. Specifically, we discuss how to define and obtain hierarchical trees, correlation based trees and networks from a correlation matrix. The hierarchical clustering and other procedures performed on the correlation matrix to detect statistically reliable aspects of the correlation matrix are seen as filtering procedures of the correlation matrix. We also discuss a method to associate a hierarchically nested factor model to a hierarchical tre…
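One standard way to obtain a correlation-based tree of the kind discussed above is a minimum spanning tree on the distance dᵢⱼ = sqrt(2 (1 − ρᵢⱼ)), which maps correlation ρ = 1 to distance 0 and ρ = −1 to distance 2. A minimal sketch using Prim's algorithm on a hypothetical 4-asset correlation matrix (the matrix values are made up for illustration):

```python
import numpy as np

def correlation_mst(corr):
    """Minimum spanning tree of the correlation-based distance
    d_ij = sqrt(2 * (1 - rho_ij)), built with Prim's algorithm.
    Returns a list of (i, j) edges, j being the newly attached node."""
    dist = np.sqrt(2.0 * (1.0 - corr))
    n = dist.shape[0]
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:                      # cheapest edge leaving the tree
            for j in range(n):
                if j not in in_tree and (
                    best is None or dist[i, j] < dist[best[0], best[1]]
                ):
                    best = (i, j)
        edges.append(best)
        in_tree.add(best[1])
    return edges

# Two correlated pairs: assets (0, 1) and (2, 3).
corr = np.array([
    [1.0, 0.8, 0.3, 0.2],
    [0.8, 1.0, 0.4, 0.1],
    [0.3, 0.4, 1.0, 0.7],
    [0.2, 0.1, 0.7, 1.0],
])
print(correlation_mst(corr))  # n - 1 = 3 edges; the strong pairs attach first
```

The MST acts as the filtering procedure mentioned in the abstract: of the n(n−1)/2 entries of the correlation matrix, only the n−1 strongest links consistent with a tree are retained.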

Keywords: Kullback–Leibler distance; correlation-based networks; hierarchical clustering; multivariate analysis; bootstrap validation; factor models; covariance matrix; portfolio optimization.

Kullback-Leibler distance as a measure of the information filtered from multivariate data

2007

We show that the Kullback-Leibler distance is a good measure of the statistical uncertainty of correlation matrices estimated by using a finite set of data. For correlation matrices of multivariate Gaussian variables we analytically determine the expected values of the Kullback-Leibler distance of a sample correlation matrix from a reference model and we show that the expected values are known also when the specific model is unknown. We propose to make use of the Kullback-Leibler distance to estimate the information extracted from a correlation matrix by correlation filtering procedures. We also show how to use this distance to measure the stability of filtering procedures with respect to s…
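A small Monte Carlo sketch of the statistical-uncertainty point: the KL distance (computed here with the closed-form zero-mean Gaussian expression ½ [tr(Σq⁻¹ Σp) − d + ln det Σq − ln det Σp]) between the generating model and a sample correlation matrix shrinks as the series length T grows. The dimension, seed and equicorrelated model are assumptions for illustration, not the paper's setup:

```python
import numpy as np

def gaussian_kl(sigma_p, sigma_q):
    """Closed-form KL between zero-mean Gaussians with the given covariances."""
    d = sigma_p.shape[0]
    _, logdet_p = np.linalg.slogdet(sigma_p)
    _, logdet_q = np.linalg.slogdet(sigma_q)
    return 0.5 * (np.trace(np.linalg.solve(sigma_q, sigma_p)) - d
                  + logdet_q - logdet_p)

rng = np.random.default_rng(7)
d = 10
true_corr = 0.3 * np.ones((d, d)) + 0.7 * np.eye(d)

def sample_kl(T):
    """KL distance of the model from a sample correlation matrix of length T."""
    x = rng.multivariate_normal(np.zeros(d), true_corr, size=T)
    return gaussian_kl(true_corr, np.corrcoef(x, rowvar=False))

# A shorter series leaves more estimation noise in the correlation matrix,
# so its KL distance from the true model should be much larger.
print(sample_kl(50), sample_kl(5000))
```

This is the quantity the abstract proposes as a yardstick: comparing it before and after a filtering procedure measures how much of the estimation noise the filter removes.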

Keywords: Kullback–Leibler divergence; covariance matrix; multivariate normal distribution; hierarchical clustering; correlation filtering; time series; stability.